Low-rank nonnegative tensor approximation via alternating projections and sketching
Authors
Abstract
We show how to construct nonnegative low-rank approximations of tensors in the Tucker and tensor train formats. We use alternating projections between the nonnegative orthant and the set of low-rank tensors, employing the STHOSVD and TTSVD algorithms, respectively, and further accelerate them with randomized sketching. Numerical experiments on both synthetic data and hyperspectral images show the decay of the negative elements and that the error of the resulting approximation is close to the initial error obtained with STHOSVD and TTSVD. The proposed method for the Tucker case is superior to previous ones in terms of computational complexity and decay of the negative elements. The tensor train case, to the best of our knowledge, has not been studied before.
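The alternating-projections idea in the abstract can be illustrated with a minimal matrix analogue (the paper itself works with Tucker and tensor train formats via STHOSVD/TTSVD; this simplified sketch uses a truncated SVD as the low-rank projection and is an assumption-laden illustration, not the authors' implementation):

```python
import numpy as np

def project_lowrank(X, r):
    """Project X onto the set of rank-<=r matrices via truncated SVD."""
    U, s, Vt = np.linalg.svd(X, full_matrices=False)
    return (U[:, :r] * s[:r]) @ Vt[:r]

def nonneg_lowrank(A, r, n_iter=50):
    """Alternating projections between the set of rank-r matrices and
    the nonnegative orthant (matrix analogue of the tensor scheme)."""
    X = A.copy()
    for _ in range(n_iter):
        X = project_lowrank(X, r)   # low-rank projection
        X = np.maximum(X, 0.0)      # nonnegative-orthant projection
    return X

# Usage: approximate a nonnegative low-rank-ish matrix.
rng = np.random.default_rng(0)
A = rng.random((40, 30)) @ rng.random((30, 20))  # entrywise nonnegative
X = nonneg_lowrank(A, r=5)
```

Ending on the orthant projection guarantees a nonnegative iterate; its rank may slightly exceed r, which is why the paper tracks the decay of negative elements across iterations rather than enforcing both constraints exactly at once.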
Similar resources
Low-rank Tensor Approximation
Approximating a tensor by another of lower rank is in general an ill-posed problem. Yet this kind of approximation is mandatory in the presence of measurement errors or noise. We show how tools recently developed in compressed sensing can be used to solve this problem. More precisely, a minimal angle between the columns of the loading matrices allows one to restore both existence and uniqueness of the...
Recovery guarantee of weighted low-rank approximation via alternating minimization
Many applications require recovering a ground truth low-rank matrix from noisy observations of the entries. In practice, this is typically formulated as a weighted low-rank approximation problem and solved using non-convex optimization heuristics such as alternating minimization. Such non-convex techniques have few guarantees. Even worse, weighted low-rank approximation is NP-hard for even the ...
Near Optimal Sketching of Low-Rank Tensor Regression
We study the least squares regression problem min_{Θ ∈ S_{D,R}} ‖AΘ − b‖₂, where S_{D,R} is the set of Θ for which Θ = ∑_{r=1}^{R} θ₁^{(r)} ∘ ⋯ ∘ θ_D^{(r)} for vectors θ_d^{(r)} ∈ ℝ^{p_d} for all r ∈ [R] and d ∈ [D], and ∘ denotes the outer product of vectors. That is, Θ is a low-dimensional, low-rank tensor. This is motivated by the fact that the number of parameters in Θ is only R · ∑_{d=1}^{D} p_d, which is significantly...
Orthogonal Low Rank Tensor Approximation: Alternating Least Squares Method and Its Global Convergence
With two notable exceptions (tensors of order 2, namely matrices, always have best approximations of arbitrary low ranks, and tensors of any order always have a best rank-one approximation), it is known that higher-order tensors may fail to have best low-rank approximations. When the condition of orthogonality is imposed, even under the modest assumption that only one set...
Relative Error Tensor Low Rank Approximation
We consider relative error low-rank approximation of tensors with respect to the Frobenius norm. Namely, given an order-q tensor A ∈ ℝ^{∏_{i=1}^{q} n_i}, output a rank-k tensor B for which ‖A − B‖_F ≤ (1 + ε)·OPT, where OPT = inf_{rank-k A′} ‖A − A′‖_F. Despite much success on obtaining relative error low-rank approximations for matrices, no such results were known for tensors. One structural issue is that...
Journal
Journal title: Computational & Applied Mathematics
Year: 2023
ISSN: ['1807-0302', '2238-3603']
DOI: https://doi.org/10.1007/s40314-023-02211-2